output weight vector

See also in other dictionaries:

  • Support vector machine — Support vector machines (SVMs) are a set of related supervised learning methods used for classification and regression. Viewing input data as two sets of vectors in an n-dimensional space, an SVM will construct a separating hyperplane in that…   Wikipedia

  • Comparison of vector graphics editors — A number of vector graphics editors for various platforms exist. Potential users of these editors will make a decision based on factors such as the availability for the user's platform, the feature set, usability of the user interface (UI) and…   Wikipedia

  • Synaptic weight — In neuroscience and computer science, synaptic weight refers to the strength or amplitude of a connection between two nodes, corresponding in biology to the amount of influence the firing of one neuron has on another. The term is typically used…   Wikipedia

  • Perceptron — The perceptron is a type of artificial neural network invented in 1957 at the Cornell Aeronautical Laboratory by Frank Rosenblatt. It can be seen as the simplest…   Wikipedia (a minimal weight-update sketch appears after this list)

  • Eigenvalues and eigenvectors — For more specific information regarding the eigenvalues and eigenvectors of matrices, see Eigendecomposition of a matrix. [Figure caption: in this shear mapping the red arrow changes direction but the blue arrow does not; the blue arrow is therefore an…]   Wikipedia

  • Oja's rule — Oja's learning rule, or simply Oja's rule, named after the Finnish computer scientist Erkki Oja, is a model of how neurons in the brain or in artificial neural networks change connection strength, or learn, over time. It is a modification of the…   Wikipedia (a sketch of the update rule appears after this list)

  • Self-organizing map — A self-organizing map (SOM) is a type of artificial neural network that is trained using unsupervised learning to produce a low-dimensional (typically two-dimensional), discretized representation of the input space of the training samples, called…   Wikipedia

  • Adaptive resonance theory — (ART) is a neural network architecture developed by Stephen Grossberg and Gail Carpenter. The basic ART system is an unsupervised learning model. It typically consists of a comparison field and a recognition field composed of…   Wikipedia

  • ADALINE — (Adaptive Linear Neuron or later Adaptive Linear Element) is a single-layer neural network. It was developed by Professor Bernard Widrow and his graduate student Ted Hoff at Stanford University in 1960. It is based on the McCulloch–Pitts neuron.…   Wikipedia

  • Linear classifier — In the field of machine learning, the goal of classification is to group items that have similar feature values into groups. A linear classifier achieves this by making a classification decision based on the value of the linear combination of…   Wikipedia (see the decision sketch after this list)

  • Least-squares spectral analysis — (LSSA) is a method of estimating a frequency spectrum, based on a least-squares fit of sinusoids to data samples, similar to Fourier analysis.…   Wikipedia (see the fitting sketch after this list)
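
The Perceptron entry above describes a weight vector adjusted during learning. A minimal sketch of that idea, assuming the classic Rosenblatt update rule on labels in {-1, +1}; the data and parameters below are illustrative, not taken from the excerpt:

```python
# Minimal perceptron sketch (assumed classic Rosenblatt rule, illustrative data).
import numpy as np

def train_perceptron(X, y, lr=1.0, epochs=100):
    """X: (n_samples, n_features); y: labels in {-1, +1}."""
    w = np.zeros(X.shape[1])   # weight vector
    b = 0.0                    # bias
    for _ in range(epochs):
        errors = 0
        for x_i, y_i in zip(X, y):
            if y_i * (np.dot(w, x_i) + b) <= 0:   # misclassified point
                w += lr * y_i * x_i               # nudge w toward the correct side
                b += lr * y_i
                errors += 1
        if errors == 0:                           # converged on separable data
            break
    return w, b

# Toy example: learn logical AND.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([-1, -1, -1, 1])
w, b = train_perceptron(X, y)
print(np.sign(X @ w + b))   # expected: [-1. -1. -1.  1.]
```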
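
The Oja's rule entry describes how a connection-strength (weight) vector changes over time. A hedged sketch of the standard form of that update, Δw = η·y·(x − y·w), on synthetic zero-mean data; the learning rate and data are assumptions:

```python
# Oja's rule sketch: a Hebbian term with a decay that keeps the weight vector
# bounded; on zero-mean data it tends toward the first principal component.
import numpy as np

rng = np.random.default_rng(0)
# Synthetic zero-mean 2-D data with most variance along the first axis.
X = rng.normal(size=(5000, 2)) * np.array([2.0, 0.5])

w = rng.normal(size=2)
w /= np.linalg.norm(w)          # start from a random unit vector
eta = 0.01                      # learning rate (assumed)

for x in X:
    y = np.dot(w, x)            # neuron output
    w += eta * y * (x - y * w)  # Oja's update: Hebbian growth minus decay

print(w)                        # approximately +/-[1, 0], the dominant direction
```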
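
The Linear classifier entry breaks off at "the linear combination of"; the decision it refers to is taken from the sign of the dot product between a weight vector and the feature vector, plus a bias. A small sketch with made-up weights:

```python
# Linear classifier decision step: the class comes from the sign of w.x + b.
import numpy as np

def linear_classify(x, w, b):
    """Return +1 or -1 depending on which side of the hyperplane w.x + b = 0 x lies."""
    return 1 if np.dot(w, x) + b > 0 else -1

# Illustrative (made-up) weight vector and bias: decide by whether x0 + 2*x1 > 1.
w = np.array([1.0, 2.0])
b = -1.0
print(linear_classify(np.array([0.0, 1.0]), w, b))   # +1, since 0 + 2 - 1 > 0
print(linear_classify(np.array([0.5, 0.0]), w, b))   # -1, since 0.5 - 1 < 0
```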
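
The Least-squares spectral analysis entry describes fitting sinusoids to data samples by least squares. A rough sketch of that idea on unevenly sampled synthetic data; the signal, frequency grid, and noise level are all assumptions for illustration:

```python
# LSSA sketch: fit a sine/cosine pair at each trial frequency by least squares
# and use the squared fitted amplitude as the spectral power.
import numpy as np

rng = np.random.default_rng(1)
t = np.sort(rng.uniform(0.0, 10.0, size=200))            # uneven sampling times
y = 1.5 * np.sin(2 * np.pi * 1.3 * t) + 0.3 * rng.normal(size=t.size)

freqs = np.linspace(0.1, 3.0, 300)
power = np.empty_like(freqs)
for i, f in enumerate(freqs):
    # Design matrix: one sine and one cosine column at trial frequency f.
    A = np.column_stack([np.sin(2 * np.pi * f * t), np.cos(2 * np.pi * f * t)])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)          # least-squares fit
    power[i] = coef[0] ** 2 + coef[1] ** 2                # squared amplitude

print(freqs[np.argmax(power)])                            # close to the true 1.3
```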
